Learning to See More: UAS-Guided Super-Resolution of Satellite Imagery for Precision Agriculture

Masrur, Arif, Olsen, Peder A., Adler, Paul R., Jackson, Carlan, Myers, Matthew W., Sedghi, Nathan, Weil, Ray R.

arXiv.org Artificial Intelligence

Unmanned Aircraft Systems (UAS) and satellites are key data sources for precision agriculture, yet each presents trade-offs. Satellite data offer broad spatial, temporal, and spectral coverage but lack the resolution needed for many precision farming applications, while UAS provide high spatial detail but are limited by coverage and cost, especially for hyperspectral data. This study presents a novel framework that fuses satellite and UAS imagery using super-resolution methods. By integrating data across spatial, spectral, and temporal domains, we leverage the strengths of both platforms cost-effectively. We use estimation of cover crop biomass and nitrogen (N) as a case study to evaluate our approach. By spectrally extending UAS RGB data to the vegetation red edge and near-infrared regions, we generate high-resolution Sentinel-2 imagery and improve biomass and N estimation accuracy by 18% and 31%, respectively. Our results show that UAS data need only be collected from a subset of fields and time points. Farmers can then 1) enhance the spectral detail of UAS RGB imagery; 2) increase the spatial resolution by using satellite data; and 3) extend these enhancements spatially and across the growing season at the frequency of the satellite flights. Our SRCNN-based spectral extension model shows considerable promise for model transferability over other cropping systems in the Upper and Lower Chesapeake Bay regions. Additionally, it remains effective even when cloud-free satellite data are unavailable, relying solely on the UAS RGB input. The spatial extension model produces better biomass and N predictions than models built on raw UAS RGB images. Once trained with targeted UAS RGB data, the spatial extension model allows farmers to forgo repeated UAS flights. While we introduce super-resolution advances, the core contribution is a lightweight and scalable system for affordable on-farm use.
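The abstract's SRCNN-based spectral extension maps 3-band RGB input to additional spectral bands. As a rough illustration only, here is a minimal NumPy forward pass in the classic SRCNN shape (9-1-5 kernels, three layers); the paper's actual architecture, band count, and layer sizes are not specified here, so everything below (kernel sizes, 16 feature maps, 2 output bands) is an assumption for the sketch:

```python
import numpy as np

def conv2d(x, w, b):
    """'Same'-padded 2-D convolution: x is (H, W, C_in), w is (k, k, C_in, C_out)."""
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (pad, pad), (0, 0)))
    H, W, _ = x.shape
    out = np.empty((H, W, w.shape[3]))
    for i in range(H):
        for j in range(W):
            # Contract the (k, k, C_in) patch against the kernel for every output channel.
            out[i, j] = np.tensordot(xp[i:i + k, j:j + k, :], w, axes=3) + b
    return out

def srcnn_spectral_extend(rgb, p):
    """SRCNN-style mapping from a 3-band RGB patch to 2 synthetic bands
    (red edge, NIR). Layer sizes follow the generic SRCNN 9-1-5 recipe,
    an assumption; the paper's trained model may differ."""
    h = np.maximum(conv2d(rgb, p["w1"], p["b1"]), 0.0)  # feature extraction
    h = np.maximum(conv2d(h, p["w2"], p["b2"]), 0.0)    # non-linear mapping
    return conv2d(h, p["w3"], p["b3"])                  # band reconstruction

rng = np.random.default_rng(0)
params = {  # random (untrained) weights, shapes only
    "w1": rng.normal(0, 0.01, (9, 9, 3, 16)), "b1": np.zeros(16),
    "w2": rng.normal(0, 0.01, (1, 1, 16, 16)), "b2": np.zeros(16),
    "w3": rng.normal(0, 0.01, (5, 5, 16, 2)), "b3": np.zeros(2),
}
rgb_patch = rng.random((32, 32, 3))            # toy UAS RGB patch
bands = srcnn_spectral_extend(rgb_patch, params)
print(bands.shape)                              # (32, 32, 2)
```

The point of the sketch is the data flow: spatial resolution is preserved while the channel dimension is remapped from visible bands to estimated red-edge and NIR bands, which can then be paired with Sentinel-2 observations.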


Dr. Frank Rosenblatt Dies at 43; Taught Neurobiology at Cornell - The New York Times

#artificialintelligence

Frank Rosenblatt, associate professor of neurobiology at Cornell University, died here yesterday in a boating accident. It was his 43d birthday. He lived in Brooktondale, N. Y., an Ithaca suburb. An originator of perception theory, he had developed an experimental machine that could be trained to identify automatically objects or patterns such as letters of the alphabet. The instrument was an electromechanical device consisting of a sensory unit of photocells that viewed the pattern shown to the machine, association units that contained the machine's memory and response units that displayed visually its pattern-recognition response.
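The machine described above is Rosenblatt's perceptron: sensory photocells feed association units whose weighted sum drives a response unit, with weights adjusted on errors. A minimal software sketch of the perceptron learning rule (not the original electromechanical Mark I; the AND-gate task and learning rate here are illustrative assumptions):

```python
def train_perceptron(samples, epochs=20, lr=1.0):
    """Rosenblatt's error-driven rule: nudge weights toward the target
    class only when the response unit misclassifies."""
    n = len(samples[0][0])
    w = [0.0] * n                               # "association unit" weights
    b = 0.0
    for _ in range(epochs):
        for x, target in samples:               # target is +1 or -1
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            predicted = 1 if activation >= 0 else -1
            if predicted != target:             # update only on error
                w = [wi + lr * target * xi for wi, xi in zip(w, x)]
                b += lr * target
    return w, b

# Toy "pattern recognition": learn logical AND from binary photocell inputs.
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, b = train_perceptron(data)
ok = all((1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1) == t
         for x, t in data)
print(ok)  # True: AND is linearly separable, so the rule converges
```

For linearly separable patterns such as this one, the perceptron convergence theorem guarantees the rule reaches a separating weight vector in finitely many updates.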


Lost in Translation

Communications of the ACM

Aaron Hertzmann's Viewpoint "Computers Do Not Make Art, People Do" (May 2020, p. 45) makes excellent points as to why it is very unlikely that computers will ever replace artists. While he did not quite state it outright, it appears to me that he may be of the opinion that replacing the (natural) intelligence of human beings with artificial intelligence is very unlikely. Most, if not all, of the endeavors we are addressing are based on digital technology, which possibly cannot replace analog entities. It is unfortunate, however, that amid the hype these days, people are either unaware of reality or simply ignoring it, with undesirable consequences. I like to cite a voicemail transcription I received recently.